Generalised Pinsker Inequalities
Authors
Abstract
We generalise the classical Pinsker inequality, which relates variational divergence to Kullback-Leibler divergence, in two ways: we consider arbitrary f-divergences in place of the KL divergence, and we assume knowledge of a sequence of values of generalised variational divergences. We then develop a best-possible inequality for this doubly generalised situation. Specialising our result to the classical case yields a new, tight, explicit bound relating KL divergence to variational divergence (solving a problem posed by Vajda some 40 years ago). The solution relies on exploiting a connection between divergences and the Bayes risk of a learning problem via an integral representation.
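For reference, the classical inequality being generalised can be stated as follows. This is a standard formulation, with the variational divergence normalised to take values in [0,2] and natural logarithms used; these conventions are assumed here rather than taken from the paper.

\[
V(P,Q) = \int |p - q| \,\mathrm{d}\mu \;\in\; [0,2],
\qquad
\mathrm{KL}(P\|Q) = \int p \,\log\frac{p}{q}\,\mathrm{d}\mu,
\]
\[
\mathrm{KL}(P\|Q) \;\ge\; \tfrac{1}{2}\,V(P,Q)^{2} \qquad \text{(classical Pinsker inequality)},
\]

where p and q are densities of P and Q with respect to a common dominating measure \mu. The paper's result replaces the KL divergence by an arbitrary f-divergence, assumes the values of a sequence of generalised variational divergences rather than V alone, and gives the best possible lower bound in that setting; specialised back to the pair (KL, V), it yields a tight explicit bound.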
Similar Articles
Weighted Csiszár-Kullback-Pinsker Inequalities and Applications to Transportation Inequalities
We strengthen the usual Csiszár-Kullback-Pinsker inequality by allowing weights in the total variation norm; admissible weights depend on the decay of the reference probability measure. We use this result to derive transportation inequalities involving Wasserstein distances for various exponents: in particular, we recover the equivalence between a T1 inequality and the existence of a ...
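For context, the unweighted Csiszár-Kullback-Pinsker inequality being strengthened, and the T1 transportation inequality referred to above, are conventionally stated as follows; the constants and normalisations are the usual ones and are assumed here, not quoted from the paper.

\[
\|\mu - \nu\|_{\mathrm{TV}} \;\le\; \sqrt{\tfrac{1}{2}\, H(\nu \,|\, \mu)}
\qquad \text{(Csisz\'ar--Kullback--Pinsker)},
\]
\[
W_1(\nu,\mu) \;\le\; \sqrt{2C\, H(\nu \,|\, \mu)} \quad \text{for all } \nu
\qquad (T_1(C)),
\]

where H(\nu\,|\,\mu) is the relative entropy of \nu with respect to \mu, \|\cdot\|_{\mathrm{TV}} = \sup_A |\mu(A)-\nu(A)| takes values in [0,1], and W_1 is the Wasserstein distance of exponent 1.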
Trends to Equilibrium in Total Variation Distance
This paper presents different approaches, based on functional inequalities, to studying the speed of convergence in total variation distance of ergodic diffusion processes whose initial law satisfies a given integrability condition. To this end, we give a general upper bound "à la Pinsker" enabling us to study the problem first via standard functional inequalities (Poincaré inequality, weak Poincar...
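As a reminder (standard definitions, not taken from the paper), the total variation distance and the Poincaré inequality invoked above can be written as

\[
\|\mu - \nu\|_{\mathrm{TV}} = \sup_{A} |\mu(A) - \nu(A)|,
\qquad
\operatorname{Var}_\mu(f) \;\le\; C_P \int |\nabla f|^{2} \,\mathrm{d}\mu \quad \text{for all smooth } f,
\]

where C_P is the Poincaré constant of the reference measure \mu; a Pinsker-type bound of the kind described above controls the total variation distance in terms of such functional-inequality quantities.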
Generalised monogamy relation of convex-roof extended negativity in multi-level systems
In this paper, we investigate the generalised monogamy inequalities of convex-roof extended negativity (CREN) in multi-level systems. The generalised monogamy inequalities provide upper and lower bounds on bipartite entanglement, obtained using CREN and the CREN of assistance (CRENOA). Furthermore, we show that the CREN of multi-qubit pure states satisfies some monogamy relatio...
Information, Divergence and Risk for Binary Experiments
We unify f-divergences, Bregman divergences, surrogate loss bounds (regret bounds), proper scoring rules, matching losses, cost curves, ROC curves and information. We do this by systematically studying integral and variational representations of these objects, and in doing so identify their primitives, all of which are related to cost-sensitive binary classification. As well as clarifying relations...
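Schematically, two of the central objects are the f-divergence and its integral representation in terms of cost-sensitive Bayes risk. The first line below is the standard definition; the second only indicates the shape of the representation, with the weight \gamma_f determined by f as specified in the paper (its exact form is not reproduced here).

\[
I_f(P,Q) = \int q \, f\!\left(\frac{p}{q}\right) \mathrm{d}\mu,
\qquad f \text{ convex},\; f(1) = 0,
\]
\[
I_f(P,Q) = \int_0^1 \Delta\underline{L}(\pi, P, Q)\,\gamma_f(\pi)\,\mathrm{d}\pi,
\]

where \Delta\underline{L}(\pi,P,Q) is the statistical information of the binary experiment (P,Q) at prior \pi, i.e. the reduction in cost-sensitive Bayes risk obtained by observing the data, and \gamma_f \ge 0 is a weight function determined by the curvature of f.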
Some inequalities for information divergence and related measures of discrimination
Inequalities which connect information divergence with other measures of discrimination or distance between probability distributions are used in information theory and its applications to mathematical statistics, ergodic theory and other scientific fields. We suggest new inequalities of this type, often based on underlying identities. As a consequence we obtain certain improvements of the well...
Journal: CoRR
Volume: abs/0906.1244
Pages: -
Publication date: 2009